Estimation Error of the Lasso

Authors

  • Nissim Zerbib
  • Yen-Huan Li
  • Ya-Ping Hsieh
  • Volkan Cevher
Abstract

This paper presents an upper bound for the estimation error of the constrained lasso, under the high-dimensional (n < p) setting. In contrast to existing results, the error bound in this paper is sharp, is valid when the parameter to be estimated is not exactly sparse (e.g., when the parameter is weakly sparse), and shows explicitly the effect of over-estimating the ℓ1-norm of the parameter to be estimated on the estimation performance of the lasso. The results of this paper show that the lasso is minimax optimal for estimating a parameter with bounded ℓ1-norm, and if the exact value of the ℓ1-norm of the parameter to be estimated is accessible, the lasso is also minimax optimal for estimating a weakly sparse parameter.
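The constrained lasso the abstract studies, min ‖y − Xβ‖² subject to ‖β‖₁ ≤ r, can be sketched with projected gradient descent onto the ℓ1-ball. The sketch below is illustrative only: the projection routine (the standard sort-based one), the step size, and the problem data are assumptions, not details from the paper.

```python
import numpy as np

def project_l1_ball(v, radius):
    """Euclidean projection of v onto the l1-ball {b : ||b||_1 <= radius}."""
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]            # magnitudes, sorted descending
    css = np.cumsum(u)
    # largest k (1-indexed) with u_k > (css_k - radius) / k
    k = np.nonzero(u * np.arange(1, len(u) + 1) > css - radius)[0][-1]
    theta = (css[k] - radius) / (k + 1.0)   # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def constrained_lasso(X, y, radius, n_iters=3000):
    """Projected gradient descent for min 0.5*||y - X b||^2 s.t. ||b||_1 <= radius."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L, L = squared spectral norm of X
    b = np.zeros(X.shape[1])
    for _ in range(n_iters):
        grad = X.T @ (X @ b - y)
        b = project_l1_ball(b - step * grad, radius)
    return b
```

Setting `radius` to the exact ℓ1-norm of the true parameter corresponds to the well-specified case in the abstract; choosing `radius` larger models the over-estimation effect whose impact the paper's bound makes explicit.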


Similar articles

Differenced-Based Double Shrinking in Partial Linear Models

The partial linear model is very flexible, since the relation between the covariates and the responses can be either parametric or nonparametric. However, estimation of the regression coefficients is challenging, since one must also estimate the nonparametric component simultaneously. As a remedy, the differencing approach, which eliminates the nonparametric component in order to estimate the regression coefficients, can ...


Mammalian Eye Gene Expression Using Support Vector Regression to Evaluate a Strategy for Detecting Human Eye Disease

Background and purpose: Machine learning is a class of modern, powerful tools that can solve many important problems people face today. Support vector regression (SVR), a notable member of the machine-learning family, is a way to build a regression model. SVR has been proven to be an effective tool for real-valued function estimation. As a supervised-learning appr...


The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)

Abstract: We revisit the adaptive Lasso as well as the thresholded Lasso with refitting, in a high-dimensional linear model, and study prediction error, ℓq-error (q ∈ {1, 2}), and the number of false positive selections. Our theoretical results for the two methods are, at a rather fine scale, comparable. The differences only show up in terms of the (minimal) restricted and sparse eigenvalues, favor...


Prediction of Rural Residents’ Consumption Expenditure Based on Lasso and Adaptive Lasso Methods

When the number of variables in a model is large, the Lasso method and the Adaptive Lasso method can effectively select variables. This paper predicts rural residents' consumption expenditure in China using the Lasso method and the Adaptive Lasso method, respectively. The results showed that both can effectively and accurately choose the appropriate variables, but the Adaptive Lasso method is ...


Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression

Abstract: We study the asymptotic properties of Lasso+mLS and Lasso+Ridge under the sparse high-dimensional linear regression model: Lasso selects predictors and then modified Least Squares (mLS) or Ridge estimates their coefficients. First, we propose a valid inference procedure for parameter estimation based on the parametric residual bootstrap after Lasso+mLS and Lasso+Ridge. Second, we der...
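The two-stage Lasso+mLS idea this snippet describes (Lasso picks the support, least squares re-estimates the coefficients on it) can be sketched as follows. This is a minimal sketch under stated assumptions, not the authors' implementation: the ISTA solver and the regularization level `lam` are illustrative choices.

```python
import numpy as np

def lasso_ista(X, y, lam, n_iters=3000):
    """Basic ISTA solver for min 0.5*||y - X b||^2 + lam * ||b||_1."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2
    b = np.zeros(X.shape[1])
    for _ in range(n_iters):
        z = b - step * X.T @ (X @ b - y)                           # gradient step
        b = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft threshold
    return b

def lasso_mls(X, y, lam):
    """Lasso+mLS sketch: Lasso selects the support, least squares re-fits on it."""
    support = np.nonzero(lasso_ista(X, y, lam))[0]
    b = np.zeros(X.shape[1])
    if support.size:
        b[support] = np.linalg.lstsq(X[:, support], y, rcond=None)[0]
    return b
```

The least-squares refit removes the shrinkage bias that the ℓ1 penalty puts on the selected coefficients, which is the usual motivation for such two-stage estimators.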



Publication date: 2016